Childlight is pleased to launch the 2025 edition of its Into the Light Index on Global Child Sexual Exploitation and Abuse (ITL Index 2025), which includes estimates on the scale of child sexual exploitation and abuse (CSEA) in countries across Western Europe and South Asia. This country-level approach provides governments, civil society and other actors with evidence that is tailored and context specific. Over the coming years, this approach will be expanded to other regions, with our 2026 edition already set to focus on countries in North America, Latin America and the Caribbean, and East Asia and the Pacific.
ITL Index 2025 is an evolution of the edition published in 2024: we have deepened our data to country-level estimates, included both technology-facilitated and in-person (offline) abuse, and integrated new data sources. Rather than simply explaining the data shown in the Index, this new supplemental thematic analysis report highlights the cross-cutting trends, patterns and insights that the Index uncovers.
This executive summary provides an overview of these findings, which are discussed in more detail in the full report. It can be read as a standalone document, or alongside our online products: our Interactive Index Dashboard, technical note and open access data archive for greater depth.
We hope that whatever your role, you find this Index a powerful tool to catalyse data-driven change – because children can’t wait.
ITL Index 2025 expands the number and type of data sources drawn into our analysis to include data from representative surveys, policing, helplines and child sexual abuse material (CSAM). The key findings presented below are linked to the thematic sections of this report.
This year, we explored both in-person (offline) and technology-facilitated abuse across multiple data sources, with a focus on prevalence, frontline data and Child Sexual Abuse Material (CSAM) indicators for countries in Western Europe and South Asia. These two regions were identified based on our ITL Index 2024, in which both regions showed a high prevalence of technology-facilitated child sexual exploitation and abuse (TF-CSEA) and/or CSAM, but with differing demographic and data landscapes.
An estimated 4.7% of children in Western Europe were reported as experiencing rape before the age of 18, and 7.4% as being sexually assaulted, with a higher prevalence of abuse among females in both of these figures.
For TF-CSEA, we estimate that 19.6% of children have experienced online solicitation before the age of 18, and 13.5% in the past year. This subtype of TF-CSEA was well represented by data sources for both lifetime before age 18 and past year recall and showed good geographical coverage. In terms of child sexual abuse material/image-based sexual abuse (CSAM/IBSA), 2.5% and 2.2% of children reported this type of TF-CSEA, in relation to lifetime before age 18 and past year prevalence, respectively. Western European countries also frequently reported prevalence estimates of exposure to unwanted sexual content. Before the age of 18, this type of exposure was experienced by 6.7% of children and by 20.2% in the past year.
Gender-based differences in TF-CSEA are evident throughout the data for Western Europe. Online solicitation was reported by 17.9% of females and 11% of males during the past year. These gender-based disparities were even more apparent when examining lifetime before age 18 prevalence, which showed an average prevalence of 26.3% for females and 14.7% for males. A relatively small difference between males and females was found in experience of CSAM/IBSA, both in lifetime before age 18 and past year. However, again females appear to be more affected than males by this type of harm, which is in line with recent findings. Conversely, more males than females were exposed to unwanted sexual content in the past year.
Western Europe was the internet host for the majority of CSAM in 2023 and 2024. In 2024, the Netherlands accounted for a disproportionate amount of CSAM, with the highest CSAM volume (over 60% of all reported CSAM from Western Europe was associated with sites in the Netherlands) and the highest CSAM availability rate (880.9 reports/notices per 10,000 population) in the region.
An estimated 12.5% of children in South Asia reported experiencing rape or sexual assault before the age of 18, with the prevalence being higher for females than males. These estimates cannot be taken as representative of the true scale of the problem, given the limited availability of representative survey data and the wide uncertainty in reported estimates of child sexual abuse in South Asia, especially for TF-CSEA. Other than in India, which is commended for its level of transparency in police data, the lack of available data and differences in recording systems across the region also create substantial gaps and inconsistencies, affecting our ability to estimate and compare the prevalence of abuse.
Within the region, India, Bangladesh and Pakistan have the highest volume of CSAM reports and together account for nearly all reports in the region, primarily based on reports by the National Center for Missing and Exploited Children (NCMEC). However, if we take population size into account, it is the Maldives that has the highest CSAM availability rate of any country within the region with 94 reports/notices per 10,000 population in 2024, followed by Bangladesh with 64.1 reports/notices per 10,000 population.
This section of the findings explores patterns of CSEA from the data, including the link between online and offline abuse, with a focus on familial abuse data, sex and/or gender differences in CSEA data, and youth-produced imagery. These findings support more effective prevention and response strategies.
Through our analysis of representative surveys, we explored the different types of perpetrators – an under-researched area within the field of CSEA.
Our analysis of Western Europe and global data highlights familial CSEA as a key issue. For example, nearly 1 in 13 children in Western Europe (7.6%) have experienced sexual assault by a family member during their lifetime before age 18. These estimates should be interpreted with caution, as they are drawn from a small subset of surveys, include wide ranges of uncertainty, and are notably absent in South Asia. At the same time, they highlight the importance of strengthening data foundations in this critical but under-measured area. As well as driving in-person (offline) abuse, familial abuse also increases the amount of new CSAM being created. Our analysis of global data from NCMEC on law enforcement action shows that when images and videos are produced of the sexual abuse of children, these are largely created by people known to the child, with nuclear family members being the most frequent creators of CSAM. However, we do not have comparable data that shows whether the same pattern holds for CSAM distribution and possession.
Across nearly all of our ITL Index 2025 indicators, particularly lifetime before age 18 experiences, CSEA is more prevalent against females than males in Western Europe. This is similar when we look at experiences in the past year, although the gap between females and males is smaller than for lifetime experiences. In the specific category of unwanted exposure to sexual content, males report a higher past year prevalence than females. This is consistent with existing research on forced or unintentional exposure to pornography among children, which defines this type of harm as a violation of their rights and highlights negative impacts on their wellbeing. There are, however, large data gaps in South Asia, which lacks representative survey data disaggregated by sex and/or gender, making it difficult to understand the prevalence of CSEA by sex and/or gender in the region.
CSAM metadata analysis conducted for ITL Index 2025 shows an overall increase in ‘self-generated’ content, whether identified through public reporting, victim reports, analyst-directed searches or web crawlers. When youth-produced images circulate online, the purpose for their generation is often difficult to ascertain. It may reflect an increase in abusive or harmful behaviour between children/peers. It may also reflect the abuse of children by adults, stemming from online solicitation, the non-consensual taking and sharing of images and videos, and sexual extortion. Imagery of this kind challenges previously held definitions of CSAM and requires a wider understanding of what sexually abusive material of children is.
The scale and nature of TF-CSEA reports are shaped by many factors, including societal perceptions and how these influence policy and practice. Debates on policy decisions which make their way into the public sphere, such as end-to-end encryption (E2EE) and online safety regulations (See sections 3.2 & 3.3), in turn influence both the risks children take and the ability to tackle and prevent abuse. Technological change can improve the ability for abuse to be disclosed, but it can also contribute to wider structural and systemic factors that enable CSEA to persist.
Too often, CSEA is only defined and tracked through the narrow lens of national criminal law, which varies widely from country to country. A universalist approach to CSEA, in which a consistent set of measures is applied to CSEA regardless of a country’s legal framework, allows for the comparison of harm across national borders. This means using common indicators, categories, definitions and typologies so that data can be aggregated and compared internationally. There is still a large evidence gap in this area on how questions and definitions in the CSEA field are interpreted across cultures and countries.
Decisions made by technology companies, governments and regulatory bodies can lead to an increase or decrease in the amount of CSAM that is reported to or discovered by the leading CSAM organisations. This is demonstrated, for example, by the implementation of E2EE as the default setting for communications. Between 2023 and 2024, all organisations tasked with CSAM data collection showed fluctuations in the amount of CSAM they assessed, but to differing proportions and in differing directions.
The year 2022 saw a wave of legislative changes around the world aiming to set parameters for the responsibility of technology platforms to ensure user safety, with new legislation introduced in the European Union (EU), Australia, the United Kingdom (UK) and the United States (US). ITL Index 2025 data for Western Europe suggests that, in countries that have recently enacted regulations, these regulations are helping CSAM to be identified and appear to be targeting hosted content. These are early insights drawn from the data for these countries; a more thorough evaluation of the impact of regulation is needed to rule out alternative explanations for the changes in the data.
An increasingly prevalent example of technological change influencing how people offend against children online is CSAM generated by artificial intelligence (AI), known as AI CSAM. The volume of AI CSAM is increasing across all data sources that track this as a specific category of material. Analysis of the severity of AI CSAM suggests this material is often of the more severe categories, almost exclusively depicting female children. As AI CSAM can be manipulated to the specifications of its creator, the increasing severity noted by the Internet Watch Foundation (IWF), and its implications for female victims, should be of note to the child protection sector, as it may suggest a growing interest in obtaining this type of material.
The amount of time taken to remove CSAM content differs across take-down organisations and data sources. This is influenced by an increase in overall CSAM volume, as well as platform decisions, such as the use of E2EE by default. Despite the best efforts of those working to remove CSAM, the length of time between CSAM being first sighted and taken down is increasing, which means that perpetrators have more time to download and re-share reported images. Known CSAM continues to circulate, as seen in the rising proportion of circulating material that is already ‘known’.
Child helplines can be a lifeline to those seeking support or advice. However, there are barriers to accessing these services, such as connectivity, language, stigma, concerns about being traced, and a lack of awareness that these services even exist. Moreover, cultural factors influence outreach to child helplines, such as religious-cultural norms and beliefs which may influence whether or not families, individuals and/or children feel comfortable seeking help, or may even view such services as unacceptable.
What, who and how we measure, matters. CSEA data measurement needs to be strengthened, from the very beginning – with how harm is reported, recorded and counted – all the way through to how that same data is contextualised, analysed and interpreted to make change.
Reliable data on CSEA is essential for effective prevention, protection and policy-making. However, differences in how cases are reported, recorded, and counted across jurisdictions and platforms can lead to inconsistencies in official statistics. These variations, shaped by differing legal frameworks, administrative practices and crime counting rules, can obscure the true scale of harm.
Many contextual factors can affect the volume of CSAM reports received in a year. One of these is organisational change and the impact it can have, particularly on big data such as CSAM data. For example, the development of three new hotlines had a drastic impact on the number of reports received by the Association of Internet Hotline Providers (INHOPE) in 2025.
No single data source can give us a complete picture of CSEA. Prevalence surveys, police records, hotline data and big data on CSAM all have gaps; some sources may underestimate the true scale of CSEA, while others may overestimate it. For example, gaps exist regarding data on children under five years old, children in conflict zones, marginalised groups, children affected by online abuse and countries with no relevant legislation.
Although ITL Index 2025 draws on multiple data sources, there are still gaps in reliable, representative and comparable data. These challenges in data coverage and availability are exacerbated as existing data sources from established data owners decline, high-quality data from gold-standard representative surveys by countries are lacking, and inconsistencies persist in how administrative data is captured and reported, even within a country.
We know that the scale of abuse, as well as how much needs to be done to better protect children, can be overwhelming. However, we also know that there is a pathway to achieving impact. This pathway has many steps, some of which need to be taken together in collaboration. Below we set out some practical steps, organised under action areas, which, while not being the end goal, show a tangible way to progress towards a safer future for children. Because Childlight is part of that journey, we also highlight where we are catalysing, collaborating and contributing to much-needed global change.
We ask governments to ensure that law enforcement agencies have access to referrals from key reporting bodies, such as the NCMEC and INTERPOL, among others, and the ability to triage those referrals to identify children and remove CSAM. This reflects our understanding that in some countries such agencies may face serious challenges in terms of data access, supportive legislation, training or resources to act on CSEA intelligence. Specifically, we ask for prioritised support in the Netherlands and the Maldives, which have high rates of CSAM reports per 10,000 population, and India, which has a high volume of CSAM overall.
We commit to working with countries to understand their current ability to access, triage, prioritise and use CSEA data, through our Childlight Technical Advisory Programme (C-TAP). We commit to providing targeted support and advice for high priority countries that show a willingness to improve their capability – with support for the Netherlands, the Maldives, India and Pakistan underway. We also commit to further research country contexts where CSAM rates are disproportionate to help support the identification of root causes for prevention and response.
We ask that, when a country survey is being designed or when CSAM data is being collected and analysed, it includes categorisation of perpetrator type, including familial abuse, where possible, to address data gaps in this area. Perpetrator type can be captured through two approaches: NCMEC data and surveys that disaggregate perpetrator categories.
We commit to the continued analysis and disaggregation of data to shine a light on the prevalence of familial abuse, exploring this through work with survivor groups and specialist researchers, with a view to developing specific indicators in the 2026 edition of our ITL Index on Global CSEA.
We ask that every country funds and implements a representative victimisation survey, to fill existing data gaps. Specifically, we ask for greater data collection in South Asia, where there is very little CSEA data from other sources. This should include a common approach to typologies to capture both in-person and technology-facilitated CSEA. An investment in training and technology to capture child helpline data will yield more detailed and comparable help-seeking data from under-researched areas. National surveys should be complemented by publicly available crime statistics and child helpline data for CSEA that include age, gender and outcomes.
We commit to identifying novel data sources and methodologies that can fill data gaps and contribute to country-level data on CSEA, especially where traditional survey data is lacking – and to making these indicators publicly available through our Index. For example, early scoping has indicated that for the East Asia and Pacific region, which is one of the regional priority areas for 2026, there will be limited data from the Pacific Islands. To address this, we will offer deep-dive analyses into Fiji and Papua New Guinea and explore working with data partners across other remote, rural, small population countries in our ITL Index 2026.
We ask that countries uphold the best interests of the child and establish legislation that gives power to a governing body to set child-centric, gender-sensitive and inclusive standards for the safety of children in online spaces, as well as consequences if these standards are not met. Countries should reflect on the regulations in place in the EU, UK and Australia as a starting point on how to both protect children online and put legal provisions and systems in place to hold accountable those who facilitate abuse. Legislation and regulation of online spaces requires an even-handed approach accompanied by increased investment in developing technological innovation. This innovation must ensure that users’ private data is protected, while also allowing for the investigation and prevention of online harms. There is more work to be done, with legislators and regulators having a difficult task ahead as they implement policies aimed at creating greater safety for all and critically evaluate those efforts.
We commit to conducting evaluation research to better understand the impact of regulation on child safety across different legislative and regulatory frameworks. We also commit to sharing our CSEA prevalence and nature research with national regulators, such as Ofcom (UK), the eSafety Commissioner (Australia) and Coimisiún na Meán (Ireland), among others. We also commit to using data to support governments to establish legislation in countries where it does not exist, to evaluating existing legislative and regulatory frameworks, and to continuing our research into AI CSAM accountability in legislation with new countries. We commit to providing research that is without fear or favour, but always in the interest of children, through our membership of groups such as the Global Online Safety Regulation Network.
We ask that when CSEA data is collected, it records both sex and gender. This will allow connection to the wider field of gender-based violence research (e.g., female genital mutilation, child marriage) and prevention programming, ensuring that support is calibrated for both females and males.
We commit to continuing to include a disaggregation of data by sex and gender, depending on the data source, in our ITL Index and upcoming editions of Searchlight – our biannual publication examining the nature of CSEA. We also commit to seeking funding to develop a doctoral student training network with a consortium of partners on technology-facilitated sexual and gender-based violence, bringing innovative methodological approaches and learning to CSEA from the violence against women field, and vice versa, as well as linking academic research to policy and practice improvements.
We ask that the lived experience of survivors is included in the design and setting of national CSEA policy. This includes consideration of schemes to provide restitution, redress, justice and healing for survivors of CSEA, including holding those who commit or facilitate abuse to account.
We commit to using our Justice Beyond Borders research, a legislative analysis of 28 countries on TF-CSEA cross-border survivor restitution, to highlight the need for an international pathway to a global restitution scheme. We commit to working with partners and supporting research on how global monetary funds operate and how such work could connect with the UN Committee on the Rights of the Child.
In the spirit of a shared vision and collaboration to protect children and prevent harm, we hope that you find Childlight’s ITL Index 2025 both insightful and useful. If you use our research to catalyse or inform change for children, we would love to hear from you. Please let us know by writing to childlight@ed.ac.uk.
We also welcome feedback on our work and other opportunities to improve and enhance the Index. We want to make sure that our resources are useful in your practice, because without you, our insight cannot be translated into much-needed action for children.